# 1.3B lightweight and efficient
## Llama 1B Dj Refine 150B

**License:** Apache-2.0

Based on the OpenLLaMA architecture, this large language model is pre-trained on the Data-Juicer-refined RedPajama and Pile datasets and outperforms other models at the same 1.3B parameter scale.

**Tags:** Large Language Model, Transformers
**Author:** datajuicer
**Downloads:** 2,834 · **Likes:** 2
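Since the model is tagged for the Transformers library, a minimal loading sketch follows. The Hugging Face repo ID used here (`datajuicer/LLaMA-1B-dj-refine-150B`) is an assumption inferred from the author and model name above; adjust it if the actual hub ID differs.

```python
# Minimal sketch: load the model with Hugging Face Transformers and generate text.
# Repo ID is assumed from the author/model name shown on this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "datajuicer/LLaMA-1B-dj-refine-150B"  # assumed hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Run a short generation to sanity-check the checkpoint.
inputs = tokenizer("Data-Juicer improves pre-training data by", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```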